
    Analyzing the Performance of Multilayer Neural Networks for Object Recognition

    In the last two years, convolutional neural networks (CNNs) have achieved an impressive suite of results on standard recognition datasets and tasks. CNN-based features seem poised to quickly replace engineered representations such as SIFT and HOG. However, compared to SIFT and HOG, we understand much less about the nature of the features learned by large CNNs. In this paper, we experimentally probe several aspects of CNN feature learning in an attempt to help practitioners gain useful, evidence-backed intuitions about how to apply CNNs to computer vision problems.
    Comment: Published in European Conference on Computer Vision 2014 (ECCV-2014)

    Generalized Phase Synchronization in unidirectionally coupled chaotic oscillators

    We investigate phase synchronization between two identical or detuned response oscillators coupled to a slightly different drive oscillator. We find that phase synchronization can occur between the response oscillators when they are driven by correlated (but not identical) inputs from the drive oscillator. We call this phenomenon Generalized Phase Synchronization (GPS) and characterize it using Lyapunov exponents and phase-difference plots.
    Comment: 4 pages, 5 figures
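    Phase synchronization of this kind is usually read off a phase-difference plot: if the unwrapped difference between the instantaneous phases of the two responses stays bounded, the oscillators are phase locked. A minimal numpy sketch of how such a plot is computed (not the paper's code; the signals, parameters, and function name are illustrative):

```python
import numpy as np

def instantaneous_phase(x):
    """Instantaneous phase via an FFT-based analytic signal (Hilbert transform)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    h[1:(n + 1) // 2] = 2          # double positive frequencies
    if n % 2 == 0:
        h[n // 2] = 1              # Nyquist bin for even-length signals
    return np.angle(np.fft.ifft(X * h))

t = np.linspace(0, 20 * np.pi, 4000)
x1 = np.sin(t)                     # first "response" oscillator (toy signal)
x2 = 0.7 * np.sin(t + 0.4)        # second: same frequency, shifted and rescaled

# unwrapped phase difference; a bounded curve indicates phase locking
dphi = np.unwrap(instantaneous_phase(x1) - instantaneous_phase(x2))
```

    For these phase-locked toy signals the difference stays near a constant offset; for unsynchronized oscillators it would drift without bound.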

    Mixing Bandt-Pompe and Lempel-Ziv approaches: another way to analyze the complexity of continuous-states sequences

    In this paper, we propose to combine the approach underlying Bandt-Pompe permutation entropy with Lempel-Ziv complexity, to design what we call Lempel-Ziv permutation complexity. The principle consists of two steps: (i) transformation of a continuous-state series, one that is intrinsically multivariate or arises from embedding, into a sequence of permutation vectors, whose components are the positions of the components of the initial vector after re-arrangement; (ii) computation of the Lempel-Ziv complexity of this series of `symbols', drawn from a discrete, finite-size alphabet. On the one hand, the permutation entropy of Bandt-Pompe studies the entropy of such a sequence, i.e., the entropy of patterns in a sequence (e.g., local increases or decreases). On the other hand, the Lempel-Ziv complexity of a discrete-state sequence studies the temporal organization of the symbols (i.e., the rate of compressibility of the sequence). The Lempel-Ziv permutation complexity thus aims to take advantage of both methods. The potential of this combined approach - joining a permutation procedure with a complexity analysis - is evaluated on both simulated and real data, comparing the individual approaches with the combined one.
    Comment: 30 pages, 4 figures
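    The two steps can be sketched directly. This is a minimal stdlib illustration, not the authors' implementation: `permutation_symbols` maps each length-`order` window to the index of its ordinal pattern (step i), and `lempel_ziv_complexity` counts the phrases of a simple LZ76-style parsing (step ii); the function names and the exact parsing variant are my assumptions.

```python
import itertools

def permutation_symbols(x, order=3):
    """Map a continuous-valued series to Bandt-Pompe ordinal-pattern symbols."""
    # enumerate all order! permutations once; each gets an integer symbol
    patterns = {p: i for i, p in enumerate(itertools.permutations(range(order)))}
    symbols = []
    for i in range(len(x) - order + 1):
        window = x[i:i + order]
        # positions of the window's components when sorted ascending
        rank = tuple(sorted(range(order), key=window.__getitem__))
        symbols.append(patterns[rank])
    return symbols

def lempel_ziv_complexity(seq):
    """Count distinct phrases in a simple left-to-right LZ76-style parsing."""
    phrases = set()
    count, i = 0, 0
    while i < len(seq):
        j = i + 1
        # extend the current phrase until it has not been seen before
        while j < len(seq) and tuple(seq[i:j]) in phrases:
            j += 1
        phrases.add(tuple(seq[i:j]))
        count += 1
        i = j
    return count
```

    For example, a monotone series such as `[1, 2, 3, 4, 5]` yields the single repeated "increasing" pattern `[0, 0, 0]`, which in turn has low Lempel-Ziv complexity.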

    Event related potentials to digit learning: Tracking neurophysiologic changes accompanying recall performance

    The aim of this study was to track recall performance and event-related potentials (ERPs) across multiple trials in a digit-learning task. When a sequence is practiced by repetition, the number of errors typically decreases and a learning curve emerges. Until now, almost all ERP learning and memory research has focused on effects after a single presentation and therefore fails to capture the dynamic changes that characterize a learning process. In contrast, the current study used a free-recall task in which sequences of ten auditory digits were presented in a logical order (control sequences) or in a random order (experimental sequences). Each sequence was presented six times, and participants had to reproduce it after each presentation; EEG was recorded during the digit presentations. Recall performance for the control sequences was close to asymptote right after the first learning trial, whereas performance for the experimental sequences initially displayed primacy and recency effects. These latter effects gradually disappeared over the six repetitions, resulting in near-asymptotic recall performance for all digits. The performance improvement for the middle items of the list was accompanied by an increase in P300 amplitude, implying a close correspondence between this ERP component and the behavioral data. These results, discussed in the framework of theories on the functional significance of the P300 amplitude, add to the scarce empirical data on the dynamics of ERP responses during intentional learning.

    Modelling of auditory evoked potentials of human sleep-wake states

    When shared concept cells support associations: Theory of overlapping memory engrams

    Assemblies of neurons, called concept cells, encode acquired concepts in the human medial temporal lobe. Concept cells shared between two assemblies have been hypothesized to encode associations between the corresponding concepts. Here we test this hypothesis in a computational model of attractor neural networks. We find that, for concepts encoded in sparse neural assemblies, there is a minimal fraction c_min of neurons shared between assemblies below which associations cannot be reliably implemented, and a maximal fraction c_max of shared neurons above which single concepts can no longer be retrieved. In the presence of a periodically modulated background signal, such as hippocampal oscillations, recall takes the form of association chains reminiscent of those postulated by theories of free recall of words. Predictions of an iterative overlap-generating model match experimental data on the number of concepts to which a neuron responds.
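    The basic setting - two stored assemblies sharing a block of neurons, with single concepts still retrievable - can be illustrated with a toy attractor network. This is a deliberately simplified stand-in for the paper's model (dense +/-1 Hopfield patterns rather than sparse assemblies; all sizes and the flip fraction are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200          # neurons in the network
shared = 20      # neurons shared between the two assemblies ("concept cells")

# two +/-1 patterns that agree on a block of `shared` neurons
p1 = rng.choice([-1, 1], size=N)
p2 = rng.choice([-1, 1], size=N)
p2[:shared] = p1[:shared]          # overlapping concept cells

# Hebbian outer-product weights, no self-connections
W = (np.outer(p1, p1) + np.outer(p2, p2)) / N
np.fill_diagonal(W, 0)

def recall(cue, steps=10):
    """Synchronous sign-threshold dynamics from a cue state."""
    s = cue.astype(float).copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1              # break ties deterministically
    return s

# cue the network with a corrupted version of p1 (~14% of bits flipped)
cue = p1.copy()
flip = rng.choice(N, size=N // 7, replace=False)
cue[flip] *= -1

out = recall(cue)
overlap = (out @ p1) / N           # close to 1 when p1 is retrieved
```

    With this small shared fraction (10%), the cued concept is retrieved cleanly; the paper's c_min and c_max describe how association and retrieval break down as that fraction is pushed lower or higher in the sparse-coding regime.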